NNs and LLMs
Everything starts with NNs like the linear perceptron, then things get more medium-specific (embeddings for text, CNNs for images), and finally we come all the way back around to LLMs for multi-modal and cross-medium use cases
NNs
TODO: Copy over old notes
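A minimal sketch of the linear perceptron mentioned in the intro, assuming the classic perceptron update rule on a toy AND dataset (numpy only; the learning rate and epoch count are arbitrary placeholders):

```python
import numpy as np

# Toy dataset: logical AND (linearly separable)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

# Weights and bias start at zero
w = np.zeros(X.shape[1])
b = 0.0
lr = 0.1  # arbitrary learning rate

for epoch in range(20):
    for xi, target in zip(X, y):
        # Step activation: predict 1 if w.x + b > 0
        pred = int(np.dot(w, xi) + b > 0)
        # Perceptron rule: nudge weights toward misclassified points
        update = lr * (target - pred)
        w += update * xi
        b += update

print(w, b)  # separates AND after a few epochs
```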
LLMs
Transformers
IMO, everything started with Transformers producing textual embeddings
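A quick sketch of pulling text embeddings out of a Transformer encoder, assuming the Hugging Face transformers library and the sentence-transformers/all-MiniLM-L6-v2 checkpoint (any BERT-style encoder works); mean pooling over the last hidden state is one common choice, not the only one:

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Assumed checkpoint: a small, popular sentence-embedding encoder
model_name = "sentence-transformers/all-MiniLM-L6-v2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

sentences = ["Transformers build contextual embeddings.", "CNNs dominate images."]
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    out = model(**batch)  # last_hidden_state: (batch, seq_len, hidden)

# Mean-pool token vectors, ignoring padding via the attention mask
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (out.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)

print(embeddings.shape)  # (2, 384) for this checkpoint
```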